
Uniqueness

Characteristic Name: Uniqueness
Dimension: Consistency
Description: The data is uniquely identifiable
Granularity: Record
Implementation Type: Rule-based approach
Characteristic Type: Declarative

Verification Metric:

The number of duplicate records reported per thousand records
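
As a rough illustration of this metric, duplicates can be counted by grouping records on their intended key. The Python sketch below is a minimal example under assumed field names (record_id) and sample data; it is not taken from the sources cited in this section.

```python
from collections import Counter

def duplicates_per_thousand(records, key="record_id"):
    """Duplicate records reported per thousand records.

    Each extra occurrence of a key value beyond the first is
    counted as one duplicate record.
    """
    if not records:
        return 0.0
    counts = Counter(r[key] for r in records)
    duplicates = sum(c - 1 for c in counts.values() if c > 1)
    return 1000 * duplicates / len(records)

# Hypothetical sample: 'S002' appears twice, so 1 duplicate in 4 records.
sample = [{"record_id": s} for s in ["S001", "S002", "S002", "S003"]]
print(duplicates_per_thousand(sample))  # 250.0
```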


The implementation guidelines describe how to put the characteristic into practice; the scenarios give an example of each guideline.

Guideline: Ensure that every entity (record) is unique by implementing a key in every relation (see the sketch after this list). Scenario: a key constraint.
Guideline: Ensure that the same entity is not recorded twice under different unique identifiers. Scenario: the same customer is entered under two different customer IDs.
Guideline: Ensure that the unique key is never null. Scenario: the Employee ID, which is the key of the Employee table, is never null.
Guideline: When using bar codes, standardise the bar code generation process so that bar codes are not reused. Scenario: UPC codes.
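
The first three guidelines lend themselves to automated checks. The sketch below is a minimal Python illustration, not a prescribed implementation; the customer fields (customer_id, name, date_of_birth) are assumed for the example.

```python
def check_uniqueness(customers):
    """Flag common uniqueness violations in a list of customer records (dicts).

    Checks, following the guidelines above:
      1. the key (customer_id) is never null;
      2. no two records share the same customer_id;
      3. the same entity (same name and date of birth) is not
         recorded under different customer IDs.
    """
    issues = []
    seen_ids = set()
    entity_to_id = {}  # (name, date_of_birth) -> first customer_id seen

    for row in customers:
        cid = row.get("customer_id")
        if cid is None:
            issues.append(f"Null key: {row}")
            continue
        if cid in seen_ids:
            issues.append(f"Duplicate key: {cid}")
        seen_ids.add(cid)

        entity = (row.get("name"), row.get("date_of_birth"))
        if entity in entity_to_id and entity_to_id[entity] != cid:
            issues.append(
                f"Same entity under IDs {entity_to_id[entity]} and {cid}"
            )
        entity_to_id.setdefault(entity, cid)
    return issues
```

In a relational database the same guarantees are normally enforced declaratively with PRIMARY KEY, NOT NULL, and UNIQUE constraints rather than with application code; the script form is useful when profiling data that arrives from outside the database.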

Validation Metric:

How mature is the creation and implementation of the DQ rules that maintain the uniqueness of data records?

These are examples of how the characteristic might occur in a database.

Example: A school has 120 current students and 380 former students (500 in total); however, the student database shows 520 different student records. These could include, for example, Fred Smith and Freddy Smith as separate records, despite there being only one student at the school named Fred Smith. This indicates a uniqueness of 500/520 x 100 = 96.2% (see the sketch after these examples). Source: N. Askham, et al., “The Six Primary Dimensions for Data Quality Assessment: Defining Data Quality Dimensions”, DAMA UK Working Group, 2013.
Example: Duplicate vendor records with the same name but different addresses make it difficult to ensure that payment is sent to the correct address. When purchases by one company are associated with duplicate master records, the credit limit for that company can unknowingly be exceeded, exposing the business to unnecessary credit risk. Source: D. McGilvray, “Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information”, Morgan Kaufmann Publishers, 2008.
Example: … on two maps of the same date. Since events have a duration, this idea can be extended to identify events that exhibit temporal overlap. Source: H. Veregin, “Data Quality Parameters”, in P. A. Longley, M. F. Goodchild, D. J. Maguire, and D. W. Rhind (eds), Geographical Information Systems: Volume 1, Principles and Technical Issues, New York: John Wiley and Sons, 1999, pp. 177-189.
Example: The patient’s identification details are correct and uniquely identify the patient. Source: P. J. Watson, “Improving Data Quality: A Guide for Developing Countries”, World Health Organization, 2003.
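
The uniqueness figure in the first example (500/520 x 100 = 96.2%) is simply the number of real-world entities divided by the number of stored records. A short check of that arithmetic, in Python purely for illustration:

```python
def uniqueness_percentage(real_world_entities, database_records):
    """Share of database records that correspond to distinct real-world entities."""
    return 100.0 * real_world_entities / database_records

# The school example above: 500 actual students, 520 student records.
print(round(uniqueness_percentage(500, 520), 1))  # 96.2
```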

The definitions below show how the characteristic is described in the sources cited.

Definition: The entity is unique; there are no duplicate values. Source: B. Byrne, J. K., D. McCarty, G. Sauter, H. Smith, and P. Worcester, “The Information Perspective of SOA Design, Part 6: The Value of Applying the Data Quality Analysis Pattern in SOA”, IBM Corporation, 2008.
Definition: Asserting uniqueness of the entities within a data set implies that no entity exists more than once within the data set and that there is a key that can be used to uniquely access each entity. For example, in a master product table, each product must appear once and be assigned a unique identifier that represents that product across the client applications. Source: D. Loshin, “Monitoring Data Quality Performance Using Data Quality Metrics”, Informatica Corporation, 2006.
Definition: Each real-world phenomenon is represented either by at most one identifiable data unit, by multiple but consistent identifiable units, or by multiple identifiable units whose inconsistencies are resolved within an acceptable time frame. Source: R. J. Price and G. Shanks, “Empirical Refinement of a Semiotic Information Quality Framework”, Proceedings of the 38th Annual Hawaii International Conference on System Sciences (HICSS'05), IEEE, 2005, pp. 216a.

 

Standards and regulatory compliance

Characteristic Name: Standards and regulatory compliance
Dimension: Validity
Description: All data processing activities should comply with the policies, procedures, standards, industry benchmark practices and all regulatory requirements that the organization is bound by
Granularity: Information object
Implementation Type: Process-based approach
Characteristic Type: Usage

Verification Metric:

The number of tasks that failed or underperformed due to non-adherence to standards and regulations
The number of complaints received due to non-adherence to standards and regulations


The implementation guidelines describe how to put the characteristic into practice; the scenarios give an example of each guideline.

Guideline: Identify the policies, procedures, standards, benchmark practices, and any regulatory requirements that an information object is bound by. Scenario: each person's compensation criteria must be determined in accordance with the Annuities Based on Retired or Retainer Pay law.
Guideline: Ensure that all data processing activities are well defined and documented based on the policies, procedures, standards, benchmarks, and regulatory requirements. Scenario: the process of making a damage estimate is well defined based on industry benchmarks.
Guideline: Ensure that the application programs cater for standards and regulatory compliance. Scenario: a software program for making damage estimates that includes all benchmark data.
Guideline: Regularly monitor data processing activities and identify problems and inefficiencies so that corrective and preventive actions can be taken. Scenario: frequent delays in time-sheet approvals result in delayed payments.
Guideline: Signs should be standardised and universally used. Scenario: in the line efficiency report, low-efficiency lines are indicated with a red light, while a green light indicates high efficiency.
Guideline: Relevant standards, procedures, policies, and regulations should be communicated to users effectively. Scenario: providing guidelines for signs.
Guideline: Ensure that proper conversion tables are maintained and used when converting attribute values to different measurement bases (see the sketch after this list). Scenario: metric conversion tables are used to convert pounds (lb) to kilograms (kg).
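
For the last guideline, a single maintained conversion table avoids ad-hoc factors scattered through application code. The sketch below is a minimal Python illustration; the table and function names are assumptions, while the factor 1 lb = 0.45359237 kg is the standard definition of the pound.

```python
# One shared, maintained conversion table used by every processing step.
CONVERSION_FACTORS = {
    ("lb", "kg"): 0.45359237,        # exact definition of the pound in kilograms
    ("kg", "lb"): 1 / 0.45359237,
}

def convert(value, from_unit, to_unit):
    """Convert a numeric attribute value between measurement bases."""
    if from_unit == to_unit:
        return value
    try:
        return value * CONVERSION_FACTORS[(from_unit, to_unit)]
    except KeyError:
        raise ValueError(f"No conversion defined from {from_unit} to {to_unit}")

print(round(convert(150, "lb", "kg"), 2))  # 68.04
```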

Validation Metric:

How mature is the process that maintains adherence to standards and regulations?

These are examples of how the characteristic might occur in a database.

Example: The age at entry to a UK primary and junior school is captured on the school application form. It is entered into a database and checked to be between 4 and 11. If it were captured on the form as 14 or N/A, it would be rejected as invalid (see the sketch after this example). Source: N. Askham, et al., “The Six Primary Dimensions for Data Quality Assessment: Defining Data Quality Dimensions”, DAMA UK Working Group, 2013.
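
A minimal sketch of the range check described in this example, assuming the value arrives as raw text from the application form (the bounds 4 and 11 come from the example above; the function name is illustrative):

```python
def valid_entry_age(raw_value, minimum=4, maximum=11):
    """Return True only if the captured value is a whole number between 4 and 11."""
    try:
        age = int(raw_value)
    except (TypeError, ValueError):
        return False          # e.g. "N/A" is rejected as invalid
    return minimum <= age <= maximum

print(valid_entry_age("7"))    # True
print(valid_entry_age("14"))   # False
print(valid_entry_age("N/A"))  # False
```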

The definitions below show how the characteristic is described in the sources cited.

Definition: A measure of the existence, completeness, quality, and documentation of data standards, data models, business rules, metadata, and reference data. Source: D. McGilvray, “Executing Data Quality Projects: Ten Steps to Quality Data and Trusted Information”, Morgan Kaufmann Publishers, 2008.
Definition: The data element has a commonly agreed upon enterprise business definition and calculations. Source: B. Byrne, J. K., D. McCarty, G. Sauter, H. Smith, and P. Worcester, “The Information Perspective of SOA Design, Part 6: The Value of Applying the Data Quality Analysis Pattern in SOA”, IBM Corporation, 2008.
Definition: Signs and other information-bearing mechanisms, such as traffic signals, should be standardized and universally used across the broadest audience possible. Source: L. P. English, “Information Quality Applied: Best Practices for Improving Business Information, Processes and Systems”, Wiley Publishing, 2009.
Definition: Validity of data refers to data that has been collected in accordance with any rules or definitions that are applicable for that data. This enables benchmarking between organisations and over time. Source: HIQA, “International Review of Data Quality”, Health Information and Quality Authority (HIQA), Ireland, 2011. http://www.hiqa.ie/press-release/2011-04-28-international-review-data-quality.